Convergence of an inertial proximal method for l1-regularized least-squares
Abstract
A fast, low-complexity algorithm for solving the ℓ1-regularized least-squares problem is devised and analyzed. Our algorithm, which we call the Inertial Iterative Soft-Thresholding Algorithm (I-ISTA), incorporates inertia into a forward-backward proximal splitting framework. We show that I-ISTA has a linear rate of convergence with a smaller asymptotic error constant than the well-known Iterative Shrinkage/Soft-Thresholding Algorithm (ISTA) for solving ℓ1-regularized least-squares. The improvement in asymptotic error constant over ISTA is significant on ill-conditioned problems and is gained with minor additional computations. We conduct numerical experiments which show that I-ISTA converges more quickly than ISTA and two other computationally comparable algorithms on compressed sensing and deconvolution problems.
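The inertial forward-backward scheme the abstract describes can be sketched as follows. This is a generic inertial proximal-gradient iteration, not the paper's exact method: the fixed inertia weight `beta` and the step size `1/L` are illustrative assumptions, whereas the paper derives its own parameter choices to obtain the improved error constant.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_ista(A, b, lam, beta=0.3, n_iter=1000):
    """Sketch of an inertial forward-backward iteration for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    beta is a hypothetical fixed inertia weight (the paper's I-ISTA
    uses its own, possibly different, parameter choice)."""
    t = 1.0 / np.linalg.norm(A, 2) ** 2      # step size 1/L, L = ||A||_2^2
    x_prev = x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        y = x + beta * (x - x_prev)          # inertial extrapolation point
        g = A.T @ (A @ y - b)                # forward (gradient) step at y
        x_prev, x = x, soft_threshold(y - t * g, t * lam)  # backward (prox) step
    return x
```

Setting `beta = 0` recovers plain ISTA; the extrapolation step `y = x + beta*(x - x_prev)` is the only added computation per iteration, which is consistent with the abstract's claim of minor additional cost.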
Similar Resources
A Proximal-Gradient Homotopy Method for the L1-Regularized Least-Squares Problem
We consider the ℓ1-regularized least-squares problem for sparse recovery and compressed sensing. Since the objective function is not strongly convex, standard proximal gradient methods only achieve sublinear convergence. We propose a homotopy continuation strategy, which employs a proximal gradient method to solve the problem with a sequence of decreasing regularization parameters. It is shown ...
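The continuation strategy can be sketched as a warm-started outer loop. The geometric decrease factor `eta`, the inner iteration budget, and the plain-ISTA inner solver here are illustrative assumptions; the cited method's actual schedule and stopping rules may differ.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_grad(A, b, lam, x0, n_iter=100):
    # Plain proximal-gradient (ISTA) inner solver, warm-started at x0.
    t = 1.0 / np.linalg.norm(A, 2) ** 2
    x = x0.copy()
    for _ in range(n_iter):
        x = soft_threshold(x - t * A.T @ (A @ x - b), t * lam)
    return x

def homotopy(A, b, lam_target, eta=0.5, n_inner=100):
    """Hypothetical continuation loop: start at lam_max (above which
    the zero vector is optimal), shrink lam geometrically, and
    warm-start each stage from the previous stage's solution."""
    lam = np.max(np.abs(A.T @ b))   # lam_max for the lasso objective
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(lam * eta, lam_target)
        x = prox_grad(A, b, lam, x, n_inner)
    return x
```

Warm-starting keeps each subproblem's iterate close to its solution, which is the mechanism the abstract credits for restoring fast convergence despite the lack of strong convexity.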
Convergence of Common Proximal Methods for L1-Regularized Least Squares
We compare the convergence behavior of ADMM (alternating direction method of multipliers), [F]ISTA ([fast] iterative shrinkage and thresholding algorithm) and CD (coordinate descent) methods on the model ℓ1-regularized least squares problem (aka LASSO). We use an eigenanalysis of the operators to compare their local convergence rates when close to the solution. We find that, when applicable, CD...
A Fast Dual Projected Newton Method for l1-Regularized Least Squares
ℓ1-regularized least squares, with its ability to discover sparse representations, is quite prevalent in the fields of machine learning, statistics, and signal processing. In this paper, we propose a novel algorithm called the Dual Projected Newton Method (DPNM) to solve the ℓ1-regularized least-squares problem. In DPNM, we first derive a new dual problem as a box-constrained quadratic programming....
Decomposable norm minimization with proximal-gradient homotopy algorithm
We study the convergence rate of the proximal-gradient homotopy algorithm applied to norm-regularized linear least squares problems, for a general class of norms. The homotopy algorithm reduces the regularization parameter in a series of steps, and uses a proximal-gradient algorithm to solve the problem at each step. The proximal-gradient algorithm has a linear rate of convergence given that the obj...
Fast Reconstruction Algorithm for Perturbed Compressive Sensing Based on Total Least-Squares and Proximal Splitting
We consider the problem of finding a sparse solution for an underdetermined linear system of equations when the known parameters on both sides of the system are subject to perturbation. This problem is particularly relevant to reconstruction in fully-perturbed compressive-sensing setups where both the projected measurements of an unknown sparse vector and the knowledge of the associated project...
Publication date: 2015